78 research outputs found

    A new perspective on Workload Control by measuring operating performances through an economic valorization

    Workload Control (WLC) is a production planning and control system conceived to reduce queuing times in job-shop systems and to offer a solution to the lead-time syndrome, a critical issue that often bewilders make-to-order manufacturers. Nowadays, the advantages of WLC are unanimously acknowledged, but real success stories are still limited. This paper starts from the lack of a consistent way to assess the performance of WLC, a significant barrier to its acceptance in industry. As researchers often focus on the performance measures that best confirm their hypotheses, many measures, related to different WLC features, have emerged over the years. However, this excess of measures may even mislead practitioners in the evaluation of alternative production planning and control systems. To close this gap, we propose quantifying the main benefit of WLC in economic terms, as this is the easiest, and probably the only, way to compare different and even conflicting performance measures. Costs and incomes are identified and used to develop an overall economic measure that can be used to evaluate, or even to fine-tune, the operating features of WLC. The quality of our approach is finally demonstrated via simulation, considering the six-machine job-shop scenario typically adopted as a benchmark in the technical literature.
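    The idea of collapsing conflicting measures into a single economic figure can be sketched as below. All function names and cost coefficients are illustrative assumptions, not the measure actually proposed in the paper.

```python
# Minimal sketch of an overall economic measure for one simulated
# production period under WLC. Coefficients are hypothetical.

def economic_measure(throughput, avg_wip, tardy_jobs,
                     revenue_per_job=100.0,   # income per delivered job
                     holding_cost=2.0,        # cost per unit of average WIP
                     tardiness_penalty=15.0): # penalty per tardy job
    """Net economic value: incomes minus WIP and tardiness costs."""
    income = revenue_per_job * throughput
    costs = holding_cost * avg_wip + tardiness_penalty * tardy_jobs
    return income - costs

# Two WLC configurations can then be ranked on a single monetary scale:
value = economic_measure(throughput=50, avg_wip=12, tardy_jobs=3)
```

    The point of the sketch is that a lower WIP and a lower tardiness percentage, which may move in opposite directions under different release rules, become directly comparable once both are priced.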

    A measurement method of routing flexibility in manufacturing systems

    This paper focuses on routing flexibility, i.e. the ability to manufacture a part type via several routes and/or to perform different operations on more than one machine. Specifically, the paper presents a comprehensive method for measuring routing flexibility in a generic manufacturing system. The problem is approached in a modular way, starting from a basic set of flexibility indexes. These are progressively extended to include more comprehensive and complex routing attributes, such as the average efficiency, the range, and the homogeneous distribution of the alternative routes. Two procedures are finally proposed to compare manufacturing systems in terms of routing flexibility. The first uses a vectorial representation of the previously defined indexes; the second is based on data envelopment analysis, a multi-criteria decision-making approach. The paper concludes with a numerical example, supported by discrete event simulation, which validates the proposed approach. © 2011 Growing Science Ltd. All rights reserved.
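    A basic flexibility index of the kind the paper starts from might look as follows. The formula is an assumption for illustration (alternative routes weighted by efficiency relative to the best route), not the exact index set defined in the paper.

```python
# Illustrative routing-flexibility index for a single part type:
# each alternative route contributes its relative efficiency
# best/t (the best route contributes 1); a part with only one
# route scores 0, since it has no routing flexibility at all.

def routing_flexibility(route_times):
    """route_times: processing times of the alternative routes."""
    if not route_times:
        return 0.0
    best = min(route_times)
    return sum(best / t for t in route_times) - 1.0

# A part with three routes of 10, 12 and 15 time units:
idx = routing_flexibility([10, 12, 15])
```

    Indexes of this shape can then be averaged over part types, or fed as inputs to the vectorial comparison or to the data envelopment analysis mentioned in the abstract.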

    Comparison of new metaheuristics, for the solution of an integrated jobs-maintenance scheduling problem

    This paper presents and compares new metaheuristics to solve an integrated jobs-maintenance scheduling problem on a single machine subject to aging and failures. The problem, introduced by Zammori et al. (2014), was originally solved using the Modified Harmony Search (MHS) metaheuristic. However, an extensive numerical analysis brought to light some structural limits of the MHS, revealing that it is outperformed by the simpler Simulated Annealing of Ishibuchi et al. (1995). Aiming to solve the problem more effectively, we integrated the MHS with local-minima escaping procedures, and we also developed a new Cuckoo Search metaheuristic, based on an innovative Levy Flight. A thorough comparison confirmed the superiority of the newly developed Cuckoo Search, which is capable of finding better solutions in less time. This is an important result for both academics and practitioners, since the integrated jobs-maintenance scheduling problem has high operational relevance but is known to be extremely hard to solve, especially within a reasonable amount of time. Moreover, the developed Cuckoo Search has been designed in an extremely flexible way and can be easily readapted and applied to a wide range of combinatorial problems. © 2018 Elsevier Ltd. All rights reserved.
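    The Levy-flight step at the heart of any Cuckoo Search can be generated with Mantegna's algorithm, the standard textbook construction; the paper's "innovative" variant is not specified in the abstract, so this sketch shows only the classic form.

```python
import math
import random

# Mantegna's algorithm for a Levy-distributed step length,
# the usual engine of Cuckoo Search moves (beta in (1, 2]).

def levy_step(beta=1.5):
    """Draw one Levy step: u / |v|^(1/beta), with u, v Gaussian."""
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma_u = (num / den) ** (1 / beta)  # scale of the numerator
    u = random.gauss(0, sigma_u)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

# A candidate solution (in a continuous encoding) is then perturbed as
# x_new = x_old + alpha * levy_step(), mixing small local moves with
# occasional long jumps that help escape local minima.
```

    The heavy-tailed step distribution is precisely what distinguishes Cuckoo Search from a plain Gaussian random walk, and it is consistent with the local-minima escaping behaviour the abstract emphasizes.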

    Defining accurate delivery dates in make to order job-shops managed by workload control

    Workload control (WLC) is a lean-oriented system that reduces queues and waiting times by imposing a cap on the workload released to the shop floor. Unfortunately, WLC does not systematically outperform push operating systems with undersaturated utilization levels and optimized dispatching rules. To address this issue, many scientific works have made use of complex job-release mechanisms and sophisticated dispatching rules, but this makes WLC too complicated for industrial applications. So, in this study, we propose a complementary approach. First, to reduce queuing-time variability, we introduce a simple WLC system; next, we integrate it with a predictive tool that, based on the system state, can accurately forecast the total time needed to manufacture and deliver a job. Due to the non-linearity between dependent and independent variables, forecasts are made using a multi-layer perceptron; for comparison, the effectiveness of both linear and non-linear multiple regression models has been tested as well. If due dates are endogenous (i.e. set by the manufacturer), they can be directly bound to this internal estimate. Conversely, if they are exogenous (i.e. set by the customer), this approach may not be enough to minimize the percentage of tardy jobs. So, we also propose a negotiation scheme, which can be used to extend exogenous due dates considered too tight with respect to the internal estimate. This is the main contribution of the paper, as it makes the forecasting approach truly useful in many industrial applications. To test our approach, we simulated a six-machine job-shop controlled with WLC and equipped with the proposed forecasting system. The obtained performances, namely WIP levels, percentage of tardy jobs, and negotiated due dates, were compared with those of a set of classical benchmarks, and demonstrated the robustness and quality of our approach, which ensures minimal delays.
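    The negotiation idea can be sketched with a simple rule: an exogenous due date tighter than the internal forecast triggers a counter-proposal. Both the decision rule and the safety margin below are illustrative assumptions, not the scheme actually specified in the paper.

```python
# Hedged sketch of due-date negotiation against an internal forecast.
# Times are measured from order receipt; margin is a hypothetical
# safety factor applied to the forecast flow time.

def negotiate_due_date(customer_due, internal_estimate, margin=1.1):
    """Return the due date to accept, extending it when too tight."""
    required = internal_estimate * margin
    if customer_due >= required:
        return customer_due   # customer date is feasible: accept it
    return required           # too tight: counter-propose an extension

# A customer asks for delivery in 20 periods, but the perceptron
# forecasts a flow time of 25 periods:
accepted = negotiate_due_date(customer_due=20, internal_estimate=25)
```

    In the simulated job-shop the `internal_estimate` would come from the multi-layer perceptron fed with the current system state; here it is just a number passed in by hand.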

    An Exploratory Research on Adaptability and Flexibility of a Serious Game in Operations and Supply Chain Management

    Serious games (SGs) in industrial engineering education are an established topic, whose implementations are continuously growing. In particular, they are recognized as effective tools to teach and learn subjects like Operations and Supply Chain Management. Research on SGs, however, is primarily focused on presenting applications and teaching results of particular games designed to achieve given purposes. In this paper, we provide an exploratory study of the flexibility and adaptability of a specific SG to different target groups and students’ needs in the field of operations and supply chain management. We first provide an overview of the SG and introduce its mechanics. Next, we explain how the mechanics have been implemented, by means of a set of parameters and indicators. We report the results of two different game sessions, played by a class of bachelor’s degree students at different levels of difficulty, which were achieved by altering some specific game parameters. By comparing the Key Performance Indicators (KPIs) of the two sessions, we discuss the consequences of the modified game parameters in terms of their impact on the difficulty level of the SG, as measured by the indicators. Experimental results match our hypothesis: the increased difficulty of sourcing and delivery times only deteriorates the related subset of indicators in the harder game session, without altering the remaining KPIs.

    Abraded Glass Strength: An Ad Hoc Fitting Protocol Based on the Change of Variable Theorem

    This work tackles the problem of finding a suitable statistical model to describe relevant glass properties, such as strength under tensile stress. As is well known, glass is a brittle material whose strength is strictly related to the presence of microcracks on its surface. The main issue is that the number of cracks, their size, and their orientation are random in nature, and they may even change over time due to abrasion phenomena. Consequently, glass strength should be treated statistically; unfortunately, none of the known probability distributions properly fits experimental data measured on abraded and/or aged glass panes. Owing to these issues, this paper proposes an innovative method to analyze the statistical properties of glass. The method takes advantage of the change of variable theorem and uses an ad hoc transforming function to properly account for the distortion, induced by the abrasion process, on the original probability distribution of the glass strength. The adopted transforming function is based on micromechanical theory, and it provides an optimal fit of the experimental data.
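    The change of variable theorem the method relies on is the standard one: if the abraded strength $Y$ is obtained from the pristine strength $X$ through a monotone, differentiable transforming function $g$, so that $Y = g(X)$, the density of $Y$ follows from that of $X$ as

```latex
f_Y(y) \;=\; f_X\!\bigl(g^{-1}(y)\bigr)\,
        \left|\frac{\mathrm{d}}{\mathrm{d}y}\, g^{-1}(y)\right|
```

    The paper's contribution lies in the specific, micromechanically derived $g$, which distorts a standard strength distribution (e.g. a Weibull, commonly used for brittle materials) into a form able to fit abraded and aged data.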

    Fuzzy Overall Equipment Effectiveness (FOEE): capturing performance fluctuations through LR Fuzzy numbers

    The paper focuses on the Overall Equipment Effectiveness (OEE), a performance indicator that is extensively used in industry. The aim is to extend the capabilities of the OEE so as to capture the day-to-day fluctuations to which manufacturing performances are subject. To this end, manufacturing losses are decomposed into elementary causes and modelled as LR fuzzy numbers. Next, in order to compute the Fuzzy Overall Equipment Effectiveness (FOEE), single losses are aggregated using the ‘fuzzy transformation model’. This approach limits the fuzzy overestimation phenomenon and assures both the accuracy and the robustness of the results. An industrial application, part of a lean project carried out by an important Italian manufacturing firm, is finally presented. Results are encouraging, since the FOEE made it possible to trace back the share of the overall fluctuations ascribable to each cause of loss. In this way, it provided the basis for setting improvement priorities and directed the lean team toward the selection of appropriate corrective actions.
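    To give a feel for what a fuzzy OEE looks like, the sketch below models availability, performance and quality as triangular fuzzy numbers and multiplies them component-wise. This is the common triangular approximation of the product, not the paper's ‘fuzzy transformation model’, which is specifically designed to limit the fuzzy overestimation this naive approach suffers from; the sample values are invented.

```python
# Triangular fuzzy numbers (a, m, b) = (min, modal, max), all positive.
# OEE = Availability * Performance * Quality, fuzzified term by term.

def fuzzy_mul(x, y):
    """Approximate product of two positive triangular fuzzy numbers."""
    return (x[0] * y[0], x[1] * y[1], x[2] * y[2])

availability = (0.85, 0.90, 0.93)  # hypothetical daily range
performance  = (0.80, 0.88, 0.95)
quality      = (0.97, 0.99, 1.00)

foee = fuzzy_mul(fuzzy_mul(availability, performance), quality)
# The spread foee[2] - foee[0] quantifies the day-to-day OEE
# fluctuation; its share per loss cause is what the FOEE traces back.
```

    Note how the spread of the product is wider than that of any single factor, which is exactly the overestimation phenomenon the paper's transformation model keeps in check.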

    ANP/RPN: A Multi Criteria Evaluation of the Risk Priority Number

    This paper presents an advanced version of the failure mode, effects and criticality analysis (FMECA), whose capabilities are enhanced in that the criticality assessment takes into account possible interactions among the principal causes of failure. This is obtained by integrating FMECA with the Analytic Network Process, a multi-criteria decision-making technique. Severity, Occurrence, and Detectability are split into sub-criteria and arranged in a hybrid (hierarchy/network) decision structure that, at the lowest level, contains the causes of failure. Starting from this decision structure, the Risk Priority Number is computed by making pairwise comparisons, so that qualitative judgements and reliable quantitative data can be easily included in the analysis without resorting to vague and unreliable linguistic conversion tables. Pairwise comparison also eases the effort of the design/maintenance team, since it is easier to make comparative rather than absolute judgments when quantifying the importance of the causes of failure. In order to clarify the rationale behind the final results, a graphical tool, similar to the House of Quality, is also presented. At the end of the paper, a case study is reported, which confirms the quality of the approach and shows its capability to perform robust and comprehensive criticality analyses.
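    The step from pairwise comparisons to priority weights can be sketched as follows. For brevity the sketch uses the geometric-mean (row-mean) method on a single matrix rather than the full ANP supermatrix the paper integrates with FMECA; the sample judgments are invented.

```python
import math

# From a reciprocal pairwise-comparison matrix (Saaty's 1-9 scale)
# to a normalized priority vector, via the geometric mean of each row.

def priorities(pairwise):
    """Normalized priority weights of the compared elements."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Three causes of failure compared under, say, the Severity criterion:
# cause 1 is moderately more critical than cause 2 (3) and strongly
# more critical than cause 3 (5); the lower triangle holds reciprocals.
A = [[1.0,   3.0,   5.0],
     [1/3.0, 1.0,   2.0],
     [1/5.0, 1/2.0, 1.0]]

w = priorities(A)
```

    In the paper's scheme, weights like these, derived per sub-criterion across the hybrid decision structure, are combined into an interaction-aware Risk Priority Number instead of the classic S×O×D product.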

    Exploiting Machine Learning and Industry 4.0 traceability technologies to re-engineering the seasoning process of traditional Parma's Ham

    The work presents a Machine Learning approach for predicting the quality of the curing process of Parma ham, combined with a business-process re-engineering study based on RFID and Deep Learning technologies for the automatic recognition and tracking of hams along the curing process. Quality management has proven to be crucial for efficient and effective processes, all the more so in the food industry, for both commercial and regulatory purposes. This is even more evident in artisanal processes, such as the traditional seasoning of Prosciutto di Parma. The work proposes and compares a Feed-Forward Neural Network and a Random Forest for predicting the distribution of the number of hams per commercial quality class of a given aging lot. Such a prediction, based on origin, process, and curing data, can provide early indications of the process output, enabling strategic commercial competitive advantages. The importance of the genetic component in determining the final quality is also evaluated, as it is considered one of the most influential external variables. Moreover, following the AS-IS description of the current process, a redesign is proposed to enable data collection and the tracking of individual hams, laying the groundwork for a future precision-prediction system that would allow even finer control of the process.